
    Oriented tensor reconstruction: tracing neural pathways from diffusion tensor MRI

    In this paper we develop a new technique for tracing anatomical fibers from 3D tensor fields. The technique extracts salient tensor features using a local regularization technique that allows the algorithm to cross noisy regions and bridge gaps in the data. We applied the method to human brain DT-MRI data and recovered identifiable anatomical structures that correspond to the white matter brain-fiber pathways. The images in this paper are derived from a dataset with 121×88×60 resolution. We were able to recover fibers at finer than voxel-size resolution by applying the regularization technique, i.e., by using a priori assumptions about fiber smoothness. The regularization procedure is performed through a moving least squares filter directly incorporated into the tracing algorithm.
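
    Purely as a hedged illustration of this kind of regularized tracing (not the paper's implementation), the sketch below follows the principal eigenvector through a tensor field and replaces the raw direction at each step with a Gaussian-weighted average of neighbouring directions; the weighting scheme, step size, and function names are assumptions, and the simple weighted average stands in for a full moving least squares fit.

```python
import numpy as np

def principal_direction(tensor):
    """Principal eigenvector of a 3x3 diffusion tensor."""
    w, v = np.linalg.eigh(tensor)
    return v[:, np.argmax(w)]

def smoothed_direction(tensor_field, point, prev_dir, radius=2.0):
    """Gaussian-weighted average of neighbouring principal directions,
    sign-aligned with the previous step (a stand-in for the MLS filter)."""
    lo = np.maximum(np.floor(point - radius).astype(int), 0)
    hi = np.minimum(np.ceil(point + radius).astype(int) + 1,
                    np.array(tensor_field.shape[:3]))
    acc = np.zeros(3)
    for z in range(lo[0], hi[0]):
        for y in range(lo[1], hi[1]):
            for x in range(lo[2], hi[2]):
                d = principal_direction(tensor_field[z, y, x])
                if np.dot(d, prev_dir) < 0:   # resolve eigenvector sign ambiguity
                    d = -d
                w = np.exp(-np.sum((np.array([z, y, x]) - point) ** 2) / radius ** 2)
                acc += w * d
    n = np.linalg.norm(acc)
    return acc / n if n > 0 else prev_dir

def trace_fiber(tensor_field, seed, step=0.5, n_steps=200):
    """Trace a streamline from `seed`, regularizing each step direction.
    `tensor_field` is assumed to have shape (Z, Y, X, 3, 3)."""
    point = np.asarray(seed, dtype=float)
    direction = principal_direction(tensor_field[tuple(point.astype(int))])
    path = [point.copy()]
    for _ in range(n_steps):
        direction = smoothed_direction(tensor_field, point, direction)
        point = point + step * direction
        if not np.all((point >= 0) & (point < np.array(tensor_field.shape[:3]) - 1)):
            break
        path.append(point.copy())
    return np.array(path)
```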

    Lead field basis for FEM source localization

    In recent years, significant progress has been made in the area of EEG/MEG source imaging. Source imaging on simple spherical models has become increasingly efficient, with consistently reported accuracy of within 5 mm. In contrast, source localization on realistic head models remains slow, with sub-centimeter accuracy being the exception rather than the norm. A primary reason for this discrepancy is that most source imaging techniques are based on lead fields. While the lead field for simplified geometries can be easily computed analytically, an efficient method for computing realistic-domain lead fields has, until now, remained elusive. In this paper, we propose two efficient methods for computing realistic EEG lead-field bases: the first is element-oriented, and the second is node-oriented. We compare these two bases, discuss how they can be used to apply recent source imaging methods to realistic models, and report timings for constructing the bases.
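
    Constructing the lead-field basis with FEM is the contribution of the paper and is not reproduced here; the minimal sketch below only shows a typical downstream use of a precomputed lead-field matrix, a standard Tikhonov-regularized minimum-norm estimate. The matrix shapes, the regularization parameter, and the function name are assumptions.

```python
import numpy as np

def minimum_norm_estimate(L, v, lam=1e-2):
    """Tikhonov-regularized minimum-norm source estimate.

    L   : (n_electrodes, n_sources) lead-field matrix, e.g. assembled from a
          node- or element-oriented basis (shape is an assumption).
    v   : (n_electrodes,) measured scalp potentials for one time sample.
    lam : regularization parameter; it would be tuned in practice.
    """
    G = L @ L.T + lam * np.eye(L.shape[0])   # regularized gram matrix
    return L.T @ np.linalg.solve(G, v)       # source amplitudes
```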

    Localization of multiple deep epileptic sources in a realistic head model via independent component analysis

    Estimating the location and distribution of current sources within the brain from electroencephalographic (EEG) recordings is an ill-posed inverse problem. The ill-posedness of the problem is due to a lack of uniqueness in the solution; that is, different configurations of sources can generate identical external fields. Additionally, the existence of only a finite number of scalp measurements increases the under-determined nature of this problem. Most source localization algorithms attempt to solve the inverse problem by fitting the potentials created on the scalp from multiple dipoles to a single time step of EEG measurements. In this paper we consider a spatio-temporal model and exploit the assumption that the EEG signal is composed of contributions from statistically independent sources. Under this assumption, we can apply the recently derived blind source separation (BSS) algorithm, also referred to as Independent Component Analysis (ICA). This algorithm separates multichannel EEG data into temporally independent activation maps due to stationary sources. For our study, we use a 64-channel EEG recording of a multi-focal epileptic event and a realistic geometric model of the cranial volume derived from MRI data. The original ICA algorithm requires the number of sources to be equal to the number of recorded channels and becomes unstable otherwise. In this paper, we propose a novel approach for solving this problem through reduction of the data subspace. Specifically, we discard eigenvectors with small eigenvalues from a PCA analysis of the data prior to ICA decomposition. Our results show that, using the proposed subspace reduction method, multi-focal epileptic data can be successfully decomposed into several independent activation maps. For each activation map we perform a separate source localization procedure, looking only for a single dipole using a multistart downhill simplex method. The localized sources are found to be located and oriented at physiologically appropriate positions within the brain.
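
    A minimal sketch of the subspace reduction idea, with scikit-learn's PCA and FastICA standing in for the PCA step and the BSS/ICA algorithm mentioned above (the study's own ICA implementation is not reproduced); array shapes and names are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def reduce_and_separate(eeg, n_components=10):
    """Discard small-eigenvalue directions with PCA, then run ICA in the
    reduced subspace to obtain temporally independent activations.

    eeg : (n_channels, n_samples) array, e.g. a 64-channel recording.
    Returns the activation time courses and the channel-space map of each component.
    """
    pca = PCA(n_components=n_components)
    reduced = pca.fit_transform(eeg.T)            # (n_samples, n_components)

    ica = FastICA(n_components=n_components, random_state=0)  # stand-in for BSS/ICA
    activations = ica.fit_transform(reduced)      # (n_samples, n_components)

    # Map each independent component back to channel space for localization.
    channel_maps = ica.mixing_.T @ pca.components_  # (n_components, n_channels)
    return activations.T, channel_maps
```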

    Statistical analysis for FEM EEG source localization in realistic head models

    Estimating the location and distribution of electric current sources within the brain from electroencephalographic (EEG) recordings is an ill-posed inverse problem. The ill-posed nature of the inverse EEG problem is due to the lack of a unique solution, such that different configurations of sources can generate identical external electric fields. In this paper we consider a spatio-temporal model, taking advantage of the entire EEG time series to reduce the extent of the configuration space we must evaluate. We apply the recently derived infomax algorithm for performing Independent Component Analysis (ICA) on the time-dependent EEG data. This algorithm separates multichannel EEG data into activation maps due to temporally independent stationary sources. For every activation map we perform a source localization procedure, looking only for a single dipole per map, thus dramatically reducing the search complexity. An added benefit of our ICA preprocessing step is that we obtain an a priori estimate of the number of independent sources producing the measured signal.
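
    The per-map localization step can be sketched as below: the dipole moment is obtained by linear least squares at each candidate location, and the location itself is refined with SciPy's Nelder-Mead (downhill simplex) optimizer restarted from several seeds, echoing the multistart simplex search mentioned in the previous abstract. The forward_model callback and all names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def dipole_residual(location, forward_model, component_map):
    """Residual of fitting a single dipole at `location` to one activation map.
    `forward_model(location)` is assumed to return the (n_electrodes, 3) lead
    field at that location, backed by the realistic head model in practice."""
    L = forward_model(location)
    moment, *_ = np.linalg.lstsq(L, component_map, rcond=None)
    return np.linalg.norm(L @ moment - component_map)

def fit_single_dipole(forward_model, component_map, start_locations):
    """Fit one dipole per activation map by restarting a downhill-simplex
    (Nelder-Mead) search from several seed locations and keeping the best fit."""
    best = None
    for x0 in start_locations:
        res = minimize(dipole_residual, np.asarray(x0, dtype=float),
                       args=(forward_model, component_map), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best.x, best.fun
```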

    High throughput biofiltration for odour control at water purification plant

    A high throughput trickling biofilter for odour control was designed based on the principles of biotrickling filter technology developed at the Bakh Institute of Biochemistry in Moscow. All the necessary blocks except the fan (a temperature and humidity control unit, the biofilter bed, an irrigation system, a control block, and a display unit) are combined within one compact biofiltration module: a standard 6000×2400×2400 mm container. The plant is thermally insulated, which enables outdoor installation. The biofilter is easily scaled up by adding extra filtration beds. A typical biofiltration module rated for 5,000-10,000 m³/h has a contact time of 3-6 s (total biofilter bed volume of 10.5 m³) and a maximum footprint of 14.5 m². After extensive pilot-plant studies, the first 5,000 m³/h trickling biofilter, easily scalable to 20,000 m³/h, was installed at the Moscow Water Works in spring 2007 to control odour emissions: hydrogen sulfide, mercaptans, and other malodorous volatile organic compounds at concentrations of up to 60 mg/m³. The performance results of the industrial biofilter are discussed.
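
    As a rough sanity check on the quoted figures, and assuming contact time is computed simply as bed volume divided by volumetric flow (which may differ from the authors' definition), the empty-bed residence time at the upper design flow comes out near the lower end of the stated 3-6 s range:

```python
# Empty-bed residence time: bed volume divided by volumetric air flow.
bed_volume_m3 = 10.5        # total biofilter bed volume from the abstract
flow_m3_per_h = 10_000      # upper end of the rated 5,000-10,000 m3/h range

contact_time_s = bed_volume_m3 / (flow_m3_per_h / 3600.0)
print(f"{contact_time_s:.1f} s")   # ~3.8 s at 10,000 m3/h
```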

    SensorSCAN: Self-Supervised Learning and Deep Clustering for Fault Diagnosis in Chemical Processes

    Modern industrial facilities generate large volumes of raw sensor data during the production process. This data is used to monitor and control the processes and can be analyzed to detect and predict process abnormalities. Typically, the data has to be annotated by experts in order to be used in predictive modeling. However, manual annotation of large amounts of data can be difficult in industrial settings. In this paper, we propose SensorSCAN, a novel method for unsupervised fault detection and diagnosis, designed for industrial chemical process monitoring. We demonstrate our model's performance on two publicly available datasets of the Tennessee Eastman Process with various faults. The results show that our method significantly outperforms existing approaches (+0.2-0.3 TPR for a fixed FPR) and effectively detects most of the process faults without expert annotation. Moreover, we show that the model fine-tuned on a small fraction of labeled data nearly reaches the performance of a SOTA model trained on the full dataset. We also demonstrate that our method is suitable for real-world applications where the number of faults is not known in advance. The code is available at https://github.com/AIRI-Institute/sensorscan
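
    The released code linked above is the reference implementation; the hedged sketch below only illustrates two downstream pieces of such a pipeline: clustering encoder embeddings of sensor windows (plain KMeans standing in for the deep clustering head) and computing the TPR-at-fixed-FPR metric quoted above. The self-supervised encoder itself is not reproduced, and all names and shapes are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_sensor_windows(embeddings, n_clusters):
    """Cluster (n_windows, d) embeddings produced by a self-supervised encoder,
    so each cluster can later be inspected and mapped to a fault type.
    KMeans is only a stand-in here for a deep clustering head."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    return km.fit_predict(embeddings), km.cluster_centers_

def tpr_at_fixed_fpr(scores, y_true, target_fpr=0.05):
    """True-positive rate at a fixed false-positive rate, the kind of metric
    quoted above (+0.2-0.3 TPR at a fixed FPR). `scores` are anomaly scores,
    `y_true` is 0 for normal windows and 1 for faulty windows."""
    threshold = np.quantile(scores[y_true == 0], 1.0 - target_fpr)
    return float(np.mean(scores[y_true == 1] > threshold))
```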

    Towards Computationally Feasible Deep Active Learning

    Active learning (AL) is a prominent technique for reducing the annotation effort required for training machine learning models. Deep learning offers a solution for several essential obstacles to deploying AL in practice but introduces many others. One such problem is the excessive computational resources required to train an acquisition model and estimate its uncertainty on instances in the unlabeled pool. We propose two techniques that tackle this issue for text classification and tagging tasks, offering a substantial reduction of AL iteration duration and of the computational overhead introduced by deep acquisition models in AL. We also demonstrate that our algorithm, which leverages pseudo-labeling and distilled models, overcomes one of the essential obstacles revealed previously in the literature. Namely, it was shown that, due to differences between the acquisition model used to select instances during AL and a successor model trained on the labeled data, the benefits of AL can diminish. We show that our algorithm, despite using a smaller and faster acquisition model, is capable of training a more expressive successor model with higher performance. Comment: Accepted at NAACL-2022 Findings.
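
    A minimal, hedged sketch of the acquisition/successor split described above, with small scikit-learn classifiers standing in for the distilled acquisition model and the more expressive successor model; the particular pseudo-labeling arrangement shown (the acquisition model labels the remaining pool for the successor) is an assumption rather than a description of the paper's algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def select_uncertain(acquisition_model, X_pool, batch_size):
    """Pick the pool instances the small acquisition model is least confident
    about, using the maximum predicted class probability as confidence."""
    confidence = acquisition_model.predict_proba(X_pool).max(axis=1)
    return np.argsort(confidence)[:batch_size]

def al_iteration(X_labeled, y_labeled, X_pool, batch_size=32):
    """One AL iteration with a cheap acquisition model and a larger successor.
    The pseudo-labeling direction here is an assumption for illustration."""
    acquisition_model = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
    query_idx = select_uncertain(acquisition_model, X_pool, batch_size)

    # Pseudo-label the non-queried pool and train the successor on everything.
    rest = np.setdiff1d(np.arange(len(X_pool)), query_idx)
    pseudo_labels = acquisition_model.predict(X_pool[rest])

    successor_model = LogisticRegression(max_iter=1000)  # stand-in for a deep model
    successor_model.fit(np.vstack([X_labeled, X_pool[rest]]),
                        np.concatenate([y_labeled, pseudo_labels]))
    return query_idx, successor_model
```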

    Heart-muscle fiber reconstruction from diffusion tensor MRI

    In this paper we use advanced tensor visualization techniques to study 3D diffusion tensor MRI data of a heart. We use scalar and tensor glyph visualization methods to investigate the data and apply a moving least squares (MLS) fiber tracing method to recover and visualize the helical structure and the orientation of the heart muscle fibers.
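
    As a small illustration of the glyph step (not the paper's visualization pipeline), an ellipsoid glyph for a single tensor is determined entirely by its eigendecomposition, with fractional anisotropy commonly used for coloring; the sketch below computes just those quantities.

```python
import numpy as np

def ellipsoid_glyph(tensor):
    """Minimal ellipsoid-glyph parameters for one 3x3 diffusion tensor:
    semi-axes = eigenvectors scaled by eigenvalues, plus the fractional
    anisotropy (FA) often used to color glyphs."""
    w, v = np.linalg.eigh(tensor)          # eigenvalues ascending, eigenvectors in columns
    w = np.clip(w, 0.0, None)              # guard against small negative eigenvalues
    axes = v * w                           # scale each eigenvector column by its eigenvalue
    md = w.mean()                          # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((w - md) ** 2) / max(np.sum(w ** 2), 1e-12))
    return axes, fa
```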